Feat/logeuclid mean and logeuclid distance to convex hull #244
Conversation
- add logeuclid mean - set default to cpm_le for QMDM
- fix bug in mean.py
Do you have first results?
Yes, and it is not working. I guess it will not be that easy to implement ^^
@qbarthelemy, what do you think of this implementation? Am I missing something?
@qbarthelemy So it is like the previous implementation, but now logm is applied to the covariance matrices, then the mean is defined with constraint programming, and in the end expm is applied. So it is like in pyRiemann:
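For reference, the procedure described above (logm into the tangent space, average, expm back) has a closed-form counterpart that the constraint-programming version should approximate. A minimal sketch, assuming SPD inputs (the function name and signature here are illustrative, not the package's actual API):

```python
import numpy as np
from scipy.linalg import expm, logm


def mean_logeuclid(X, sample_weight=None):
    """Closed-form log-Euclidean mean of SPD matrices.

    Each matrix is mapped to the tangent space with the matrix
    logarithm, averaged there, and mapped back with the matrix
    exponential.
    """
    n = len(X)
    if sample_weight is None:
        sample_weight = np.ones(n) / n  # uniform weights by default
    logs = np.array([logm(x) for x in X])
    # Weighted average in the log-domain, then map back to SPD cone
    return expm(np.einsum('i,ijk->jk', sample_weight, logs))
```

For example, the log-Euclidean mean of `2*I` and `8*I` is `4*I` (the geometric mean of the scales), which gives a quick sanity check for any CPM-based variant.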
Co-authored-by: Quentin Barthélemy <q.barthelemy@gmail.com>
Thanks for the review, @qbarthelemy. I will address the comments as soon as possible. I see a difference in the matrices depending on which option I use: the first option returns matrices with mostly zeros. Then, when I take the exponential of the mean returned by the optimizer, it is always a diagonal matrix. I suspect there is some issue with the optimizer. I can already see that Adam provides better results than COBYLA when debugging with a classical optimizer.
- add import for @deprecated
- rename le_mean_cpm -> mean_logeuclid_cpm
Co-authored-by: Quentin Barthélemy <q.barthelemy@gmail.com>
- expose optimizer - rename covmats -> X
Co-authored-by: Quentin Barthélemy <q.barthelemy@gmail.com>
Hm. The equation should be minimal when the sum of the covariance matrices (Xi) leans towards Y. I cannot remember why this
Co-authored-by: Quentin Barthélemy <q.barthelemy@gmail.com>
add regularization to test_utils_mean
Hm. The equation should be minimal when the sum of the covariance matrices (Xi) leans towards Y.
Yes, you are right. But null weights give the same minimal objective.
Before merging, can you add check_weights on the master version, to check whether the weights were always null or not?
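A rough sketch of what such a weight check could look like — this is a hypothetical approximation of a validation helper, not necessarily pyRiemann's actual check_weights signature or behavior:

```python
import numpy as np


def check_weights(weights, n_matrices):
    # Hypothetical weight sanity check: validate the shape,
    # reject negative or all-null weights, and normalize the
    # weights so that they sum to one.
    if weights is None:
        return np.ones(n_matrices) / n_matrices  # default: uniform
    weights = np.asarray(weights, dtype=float)
    if weights.shape != (n_matrices,):
        raise ValueError("Weights do not have the expected length")
    if np.any(weights < 0):
        raise ValueError("Weights must be nonnegative")
    total = weights.sum()
    if total == 0:
        raise ValueError("Weights cannot all be null")
    return weights / total
```

Calling it on the weights returned by the optimizer would surface the degenerate all-null solution immediately instead of silently producing a zero mean.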
Furthermore, the objective can become even smaller if weights are negative.
Is it possible to add in the optimizer a constraint on positivity (like in Eq(22) of Zhao2019)?
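To illustrate the simplex constraint with a classical optimizer, here is a sketch of the log-Euclidean distance to a convex hull with weights restricted to be nonnegative and sum to one (as in Eq(22) of Zhao2019). The function name and the exact Frobenius objective are assumptions for illustration; the PR's actual implementation uses a constraint programming model instead of scipy:

```python
import numpy as np
from scipy.linalg import logm
from scipy.optimize import minimize


def logeuclid_dist_convex_hull(X, y):
    """Distance from logm(y) to the convex hull of {logm(X_i)}.

    Weights are constrained to the probability simplex:
    nonnegative and summing to one.
    """
    logs = np.array([logm(x) for x in X])
    logy = logm(y)
    n = len(X)

    def objective(w):
        # Squared Frobenius distance in the log-domain
        diff = np.einsum('i,ijk->jk', w, logs) - logy
        return np.linalg.norm(diff, 'fro') ** 2

    cons = {'type': 'eq', 'fun': lambda w: w.sum() - 1}  # sum to one
    bounds = [(0, 1)] * n  # positivity constraint on each weight
    res = minimize(objective, np.ones(n) / n,
                   bounds=bounds, constraints=cons)
    return np.sqrt(res.fun)
```

With this constraint, the all-null and negative-weight solutions are excluded by construction: a matrix inside the hull gets distance near zero, and one outside gets a strictly positive distance.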
In fact, tests fail, raising errors from the new check: https://github.com/pyRiemann/pyRiemann-qiskit/actions/runs/8061607519/job/22019679900#step:7:401
We can still set the lower bound to some minimum, such as 10^-4. However, maybe in this case the problem is with the matrices that are generated for the test. Actually, this test here is redundant: it should be located in test_utils_distance.py (and is already implemented there).
This PR implements the logeuclid mean with a constraint programming model, and fixes the logeuclid distance to convex hull.